Journal "Logical Methods in CS"

From the Proof Theory mailing list: the Logical Methods in Computer Science journal (LMCS) has been launched, with the following message from the editors:

Dear Colleague:

We are writing to inform you about the progress of the open-access,
online journal "Logical Methods in Computer Science," which has recently
benefited from a freshly designed web site, see:

                   http://www.lmcs-online.org

In the first year of its existence, the journal received 75 submissions:
21 were accepted and 22 declined (the rest are still in the editorial
process). The first issue is complete, and we anticipate
that there will be three in all by the end of the calendar year. Our eventual
aim is to publish four issues per year. We also publish Special Issues:
to date, three are in progress, devoted to selected papers from LICS
2004, CAV 2005 and LICS 2005.

The average turn-around from submission to publication has been
7 months. This comprises a thorough refereeing and revision process:
every submission is refereed in the normal way by two or more
referees, who apply high standards of quality.

We would encourage you to submit your best papers to Logical Methods in
Computer Science, and to encourage your colleagues to do so too.
There is a flier and a leaflet containing basic information about the
new journal on the homepage; we would appreciate your posting
and distributing them, or otherwise publicising the journal. We would
also appreciate any suggestions you may have on how we may improve the
journal.

Yours Sincerely,

Dana S. Scott (editor-in-chief)
Gordon D. Plotkin and Moshe Y. Vardi (managing editors)
Jiri Adamek (executive editor)

PS. I would have posted this to the front page, but these privileges have been lost, no doubt for recalcitrant inactivity...

Linear types for aliased resources

Linear types for aliased resources. Chris Hawblitzel.

Type systems that track aliasing can verify state-dependent program properties. For example, such systems can verify that a program does not access a resource after deallocating the resource. The simplest way to track aliasing is to use linear types, which on the surface appear to ban the aliasing of linear resources entirely. Since banning aliasing is considered too draconian for many practical programs, researchers have proposed type systems that allow limited forms of aliasing, without losing total control over state-dependent properties. This paper describes how to encode one such system, the capability calculus, using a type system based on plain linear types with no special support for aliasing. Given well-typed capability calculus source programs, the encodings produce well-typed target programs based on linear types. These encodings demonstrate that, contrary to common expectations, linear type systems can express aliasing of linear resources.

The prose is slightly confusing, but the code examples help.

Sections 1 & 2 should be enough for those who only want to understand the problem with aliasing, and get a glimpse of what linear types are.
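For those who stop after Sections 1 & 2, here is a minimal sketch of my own (using GHC's LinearTypes notation, which postdates the paper and is not its capability-calculus encoding) of the basic idea: a linear resource must be consumed exactly once, so a program that touches a resource after releasing it simply does not type-check.

    {-# LANGUAGE LinearTypes #-}

    -- A toy resource whose deallocation is tracked by the type system.
    data Res = Res

    -- Linear arrows (%1 ->) require the argument to be consumed exactly once.
    use :: Res %1 -> (Int, Res)   -- using the resource hands it back for further use
    use Res = (42, Res)

    release :: Res %1 -> ()       -- releasing the resource consumes it for good
    release Res = ()

    -- Accepted: the resource is threaded through and released exactly once.
    ok :: Res %1 -> Int
    ok r0 =
      case use r0 of
        (n, r1) -> case release r1 of
                     () -> n

    -- Rejected by the type checker: r is consumed twice, so an access to a
    -- released resource cannot be written.
    -- bad :: Res %1 -> Int
    -- bad r = case release r of () -> fst (use r)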

What do you mean by studying "programming languages"?

Reading some of the recent threads, it occurred to me that there's room for a new thread in the spirit of the Getting Started thread and threads like Explaining monads and Why type systems are interesting.

The issue I want to raise is what is at the focus of the study of programming languages. By this I mean, what is the thing (or things) programming language researchers study. I do not want to discuss what should be done, at the moment, but rather orient outsiders about the actual practice in the field.

Specifically, my aim is to explain which aspects of languages are studied, to let people know what these research areas are called, and perhaps where to find more information about new results etc. I am not thinking about specific lines of research, but rather about the kinds of things researchers study.

My intention will hopefully be clearer after reading my answer to the question in the title of this post:

Most of the time, programming language theory (PLT) is concerned with the semantics (i.e., meaning) of specific programming constructs. Much less research is done on other aspects, such as syntax, implementation and runtime issues, and languages as wholes (as opposed to the study of individual constructs).

This is, of course, a very incomplete answer to the question. Let the discussion begin...

C#: Yesterday, Today, and Tomorrow: An Interview with Anders Hejlsberg

After Larry, it is appropriate to read Anders, who makes a point of saying that in designing C# they try to make sure that there are not multiple ways of doing things...

I should point out that saying it's all syntax in the end isn't the same as saying that languages are syntactic sugar. The term "Syntactic sugar" is often misused. It is a technical term, with precise meaning ;-)

Personally, I find this sort of humour bothersome: Oh, absolutely. And, you know, honestly, first of all, let's give credit where credit is due. I am not inventing anything completely new here. It's all based on this thing called lambda expressions or lambda calculus or whatever, which has existed in the functional programming space for decades. But somehow, that has never really seen the light of day in a mainstream programming language.

Imagine a doctor (GP) talking about the Avian flu: "There's something called mutations or whatever, which has existed in biology for decades. But somehow, that has never really seen the light of day in general medical practice."

It's quite legitimate to mention that some of these ideas, while not original, were indeed not to be found in "mainstream languages" (even though one could argue that C# isn't the first to try to do something about it). But why the dismissive "or whatever"? LC is part of the science of programming (and of CS in general), and to me that sounded like trying to make Philistines happy. I think all of us can try to raise the level of discussion in the field.

It's an interesting interview nonetheless, and Anders always sounds like a nice and reasonable person.

Larry Wall: State of the Onion

Larry is as amusing as you'd expect him to be.

Of particular interest is the endorsement of Pugs, and the fact that Haskell is mentioned by name.

Multigame - A Very High Level Language for Describing Board Games

Multigame - A Very High Level Language for Describing Board Games, John W. Romein, Henri E. Bal, Dick Grune. First Annual ASCI Conference, 1995.

Languages with implicit parallelism are easier to program in than those with explicit parallelism, but finding and efficiently exploiting parallelism in general-purpose programming languages by parallelizing compilers is hard. A compiler for a Very High Level Language, designed for a specific application domain, has more knowledge about its application domain and may use this knowledge to generate efficient parallel code without requiring the programmer to deal with explicit parallelism. To investigate this idea, we designed Multigame, a Very High Level Language for describing board games. A Multigame program is a formal description of the rules of a game, from which the Multigame compiler automatically generates a parallel game playing program.

An amusing DSL, and an interesting investigation of implicit parallelism.

Also see this later paper.

It would be nice to find a downloadable implementation, by the way.
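To give a rough feel for the kind of parallelism such a domain-specific compiler can exploit on the programmer's behalf, here is a sketch (ordinary Haskell with the parallel package, not Multigame; the game itself is a made-up stand-in) of evaluating the moves of a position in parallel, exactly the sort of split a Multigame programmer never has to write by hand:

    import Control.Parallel.Strategies (parMap, rseq)

    -- A deliberately abstract "game": positions, legal moves, and a score.
    -- These are stand-ins for illustration, not Multigame's interface.
    data Position = Position [Int]

    moves :: Position -> [Position]
    moves (Position xs) = [Position (x : xs) | x <- [1 .. 10]]

    score :: Position -> Int
    score (Position xs) = sum xs

    -- Sequential fixed-depth search...
    search :: Int -> Position -> Int
    search 0 p = score p
    search d p = maximum (map (negate . search (d - 1)) (moves p))

    -- ...and the same search with the root moves evaluated in parallel.
    parSearch :: Int -> Position -> Int
    parSearch 0 p = score p
    parSearch d p = maximum (parMap rseq (negate . search (d - 1)) (moves p))

    main :: IO ()
    main = print (parSearch 4 (Position []))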

Modular Checking for Buffer Overflows in the Large

Modular Checking for Buffer Overflows in the Large. Brian Hackett, Manuvir Das, Daniel Wang, Zhe Yang.

We describe an ongoing project, the deployment of a modular checker to statically find and prevent every buffer overflow in future versions of a Microsoft product. Lightweight annotations specify requirements for safely using each buffer, and functions are checked individually to ensure they obey these requirements and do not overflow. To date over 400,000 annotations have been added to specify buffer usage in the source code for this product, of which over 150,000 were automatically inferred, and over 3,000 potential buffer overflows have been found and fixed.

Good to know someone is doing something about buffer overflows...
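The annotations in the paper are attached to C function signatures and verified statically by the checker. Purely as a loose analogy (the types, names, and the run-time check below are mine, not the paper's), here is a Haskell sketch of the underlying idea that a buffer's size belongs in each function's interface, which is what makes checking one function at a time possible:

    {-# LANGUAGE DataKinds, KindSignatures, ScopedTypeVariables #-}

    import Data.Proxy (Proxy (..))
    import GHC.TypeLits (KnownNat, Nat, natVal)

    -- A buffer whose capacity is part of its type, so every function that
    -- touches it states the size it relies on.
    newtype Buffer (n :: Nat) = Buffer [Char]

    capacity :: forall n. KnownNat n => Buffer n -> Integer
    capacity _ = natVal (Proxy :: Proxy n)

    -- The "annotation" here is the signature: writeAt only promises to work
    -- for indices below the declared capacity, and checks that locally.
    writeAt :: KnownNat n => Int -> Char -> Buffer n -> Maybe (Buffer n)
    writeAt i c buf@(Buffer cs)
      | i >= 0 && fromIntegral i < capacity buf =
          Just (Buffer (take i cs ++ [c] ++ drop (i + 1) cs))
      | otherwise = Nothing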

Commercial Users of Functional Programming (CUFP)

(via the LtU forum)

The presentations aren't online, unfortunately, but it's good to know about this meeting.

Mechanizing Language Definitions

The problem: Most languages don't have formal specifications.

The solution (maybe): Make specification easier by using mechanized tools (for example, use Twelf).

The presentation: here (Robert Harper, ICFP'05).

The conclusion: You decide.
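For the undecided: even a definitional interpreter pins a language down far more precisely than prose, and that is the modest end of what "mechanizing" means. Purely as a flavor (a Haskell toy of my own, not a Twelf encoding, which would state the syntax and judgments in a logical framework and support machine-checked metatheory):

    -- Abstract syntax of a tiny expression language.
    data Exp
      = Lit Int
      | Plus Exp Exp
      | IfZero Exp Exp Exp
      deriving Show

    -- Its semantics, stated once and for all as a total function.
    eval :: Exp -> Int
    eval (Lit n)        = n
    eval (Plus e1 e2)   = eval e1 + eval e2
    eval (IfZero c t e) = if eval c == 0 then eval t else eval e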

LINQ BOF at OOPSLA

On Wednesday October 19, 2005, Mads Torgersen, Amanda Silver, and yours truly will be presenting a BOF on LINQ in the Royal Palm Salon 1+2, 5:00 – 7:30pm during OOPSLA 2005 in the Town & Country Resort & Convention Center in San Diego, CA.

With three language geeks (one from VB land, one ex-academic from the Java side, and one ex-academic from the Haskell side) presenting, this should be a fun night.

Abstract
Modern applications operate on data in several different forms: Relational tables, XML documents, and in-memory objects. Each of these domains can have profound differences in semantics, data types, and capabilities, and much of the complexity in today's applications is the result of these mismatches. The future "Orcas" release of Visual Studio aims to unify the programming models through integrated query capabilities in C# and Visual Basic, a strongly typed data access framework, and an innovative API for manipulating and querying XML. This talk explains the ideas behind language integrated queries (LINQ) and discusses the language enhancements behind them.

Slides: The slides for the presentation are here.
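For readers outside the .NET world: over in-memory objects, at least, the "integrated query" idea is close to what comprehensions already provide. A rough, non-LINQ Haskell analogue (the record types and the query are invented for illustration; C# 3.0's from/where/select query syntax has the same shape):

    -- In-memory "tables" to query; purely illustrative types.
    data Customer = Customer { custId :: Int, custName :: String, city :: String }
    data Order    = Order    { orderCust :: Int, amount :: Double }

    -- A join-and-filter query written as a list comprehension.
    bigSpendersIn :: String -> [Customer] -> [Order] -> [(String, Double)]
    bigSpendersIn town customers orders =
      [ (custName c, amount o)
      | c <- customers
      , o <- orders
      , orderCust o == custId c
      , city c == town
      , amount o > 1000
      ]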